Covering Number Bounds of Certain Regularized Linear Function Classes

Author

  • Tong Zhang
Abstract

Recently, sample complexity bounds have been derived for problems involving linear functions such as neural networks and support vector machines. In many of these theoretical studies, the concept of covering numbers played an important role. It is thus useful to study covering numbers for linear function classes. In this paper, we investigate two closely related methods to derive upper bounds on these covering numbers. The first method, already employed in some earlier studies, relies on the so-called Maurey’s lemma; the second method uses techniques from the mistake bound framework in online learning. We compare results from these two methods, as well as their consequences in some learning formulations.
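Since the abstract centers on covering numbers, a small illustration of the concept may help: the ε-covering number N(ε) of a set is the smallest number of balls of radius ε needed to cover it. This sketch is not from the paper; it simply computes the standard covering number of the unit interval [0, 1] under the absolute-value metric, where centers spaced 2ε apart suffice.

```python
import math

def interval_covering_number(eps):
    """Smallest number of closed balls of radius eps (intervals of
    length 2*eps) needed to cover [0, 1]: ceil(1 / (2*eps))."""
    return math.ceil(1.0 / (2.0 * eps))

# With radius 0.1, centers at 0.1, 0.3, 0.5, 0.7, 0.9 cover [0, 1].
print(interval_covering_number(0.1))  # 5
```

For function classes, the same idea is applied to the class under an empirical metric; the paper's contribution is bounding such covering numbers for regularized linear classes.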


Related articles

Some Theoretical Results Concerning the Convergence of Compositions of Regularized Linear Functions

Recently, sample complexity bounds have been derived for problems involving linear functions such as neural networks and support vector machines. In this paper, we extend some theoretical results in this area by deriving dimension-independent covering number bounds for regularized linear functions under certain regularization conditions. We show that such bounds lead to a class of new methods...


On a linear combination of classes of harmonic $p-$valent functions defined by certain modified operator

In this paper we obtain coefficient characterization, extreme points and distortion bounds for the classes of harmonic $p$-valent functions defined by certain modified operator. Some of our results improve and generalize previously known results.


Analysis of Regularized Linear Functions for Classification Problems

Recently, sample complexity bounds have been derived for problems involving linear functions such as neural networks and support vector machines. In this paper, we extend some theoretical results in this area by providing convergence analysis for regularized linear functions with an emphasis on classification problems. The class of methods we study in this paper generalizes support vector machine...


Generalization Bounds and Learning Rates for Regularized Principal Manifolds

We derive uniform convergence bounds and learning rates for regularized principal manifolds. This builds on previous work of Kegl et al.; however, we are able to obtain stronger bounds by taking advantage of the decomposition of the principal manifold in terms of kernel functions. In particular, we are able to give bounds on the covering numbers which are independent of the number of basis function...


Bounds on the outer-independent double Italian domination number

An outer-independent double Italian dominating function (OIDIDF) on a graph $G$ with vertex set $V(G)$ is a function $f:V(G)\longrightarrow \{0,1,2,3\}$ such that if $f(v)\in\{0,1\}$ for a vertex $v\in V(G)$ then $\sum_{u\in N[v]}f(u)\geq 3$, and the set $\{u\in V(G) \mid f(u)=0\}$ is independent. The weight of an OIDIDF $f$ is the value $w(f)=\sum_{v\in V(G)}f(v)$. The minimum weight of an OIDIDF on a graph $G$ is cal...
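The two conditions in this definition can be checked mechanically. The following sketch (not from the cited paper; the adjacency-dict graph representation is an assumption for illustration) verifies both OIDIDF conditions and computes the weight on a small example.

```python
def is_oididf(adj, f):
    """Check the OIDIDF conditions on a graph given as an adjacency
    dict {vertex: list of neighbors} with labeling f: vertex -> {0,1,2,3}."""
    # Condition 1: every vertex with f(v) in {0, 1} must have total
    # weight >= 3 over its closed neighborhood N[v].
    for v in adj:
        if f[v] in (0, 1) and f[v] + sum(f[u] for u in adj[v]) < 3:
            return False
    # Condition 2: the vertices assigned 0 must form an independent set.
    zeros = {v for v in adj if f[v] == 0}
    return not any(u in zeros for v in zeros for u in adj[v])

def weight(f):
    # w(f) = sum of f(v) over all vertices.
    return sum(f.values())

# Path a - b - c: assigning 0, 3, 0 satisfies both conditions.
adj = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
f = {"a": 0, "b": 3, "c": 0}
print(is_oididf(adj, f), weight(f))  # True 3
```

Lowering the center to 2 breaks the closed-neighborhood condition at the endpoints, so the function is no longer an OIDIDF.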



Journal:
  • Journal of Machine Learning Research

Volume 2, Issue –

Pages –

Published 2002